31 research outputs found

    Developing the human-computer interface for Space Station Freedom

    For the past two years, the Human-Computer Interaction Laboratory (HCIL) at the Johnson Space Center has been involved in prototyping and prototype reviews in support of the definition phase of the Space Station Freedom program. On the Space Station, crew members will interact with multi-monitor workstations where working with several displays at one time will be common. The HCIL has conducted several experiments to begin to address design issues for this complex system. Experiments have dealt with the design of ON/OFF indicators, the movement of the cursor across multiple monitors, and the importance of various windowing capabilities for users performing multiple tasks simultaneously.

    Human Factors Assessment of Vibration Effects on Visual Performance During Launch

    The Human Factors Assessment of Vibration Effects on Visual Performance During Launch (Visual Performance) investigation will determine visual performance limits during operational vibration and g-loads on the Space Shuttle, specifically by determining the minimum readable font size during ascent using planned Orion display formats. Research Summary: The aim of the Visual Performance investigation is to provide data supplementary to that collected by the Thrust Oscillation Seat Detailed Technical Objective (DTO) 695 (Crew Seat DTO), which will measure seat acceleration and vibration from one flight deck and two middeck seats during ascent. While the Crew Seat DTO data alone are important in terms of providing a measure of vibration and g-loading, human performance data are required to fully interpret the operational consequences of the vibration values collected during Space Shuttle ascent. During launch, crewmembers will be asked to view placards with varying font sizes and indicate the minimum readable size. In combination with the Crew Seat DTO, the Visual Performance investigation will:
    o Provide flight-validated evidence that will be used to establish vibration limits for visual performance during combined vibration and linear g-loading.
    o Provide flight data as inputs to ongoing ground-based simulations, which will further validate crew visual performance under vibration loading in a controlled environment.
    o Provide vibration and performance metrics to help validate procedures for ground tests and analyses of seats, suits, displays and controls, and human-in-the-loop performance.
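    The core measurement described above reduces to a simple aggregation: for each vibration level, find the smallest font size the crewmember reported readable. A minimal sketch of that reduction, assuming a hypothetical (vibration, font size, readable) trial format rather than the actual DTO telemetry layout:

```python
def min_readable_by_vibration(trials):
    """Smallest readable font size per vibration level.

    trials: iterable of (vibration_g_rms, font_size_pt, readable) tuples.
    This data format is hypothetical; the real placard responses and
    Crew Seat DTO accelerometer data would be recorded differently.
    """
    smallest = {}
    for g_rms, font_pt, readable in trials:
        # keep the smallest font reported readable at each vibration level
        if readable and (g_rms not in smallest or font_pt < smallest[g_rms]):
            smallest[g_rms] = font_pt
    return smallest
```

    In practice the vibration values would be binned rather than matched exactly, but the aggregation logic is the same.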

    Determining Desirable Cursor Control Device Characteristics for NASA Exploration Missions

    A test battery was developed for cursor control device evaluation: four tasks were taken from ISO 9241-9, and three from previous studies conducted at NASA. The tasks focused on basic movements such as pointing, clicking, and dragging. Four cursor control devices were evaluated with and without Extravehicular Activity (EVA) gloves to identify desirable cursor control device characteristics for NASA missions: 1) the Kensington Expert Mouse, 2) the Hulapoint mouse, 3) the Logitech Marble Mouse, and 4) the Honeywell trackball. Results showed that: 1) the test battery is an efficient tool for differentiating among input devices; 2) gloved operations were about 1 second slower and had at least 15% more errors; 3) devices used with gloves have to be larger, and should allow good hand positioning to counteract the lack of tactile feedback; and 4) none of the devices, as designed, were ideal for operation with EVA gloves.
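    ISO 9241-9 pointing tasks are conventionally scored with Fitts' law throughput, which normalizes movement time by task difficulty so devices can be compared across target sizes and distances. A minimal sketch of that computation using the Shannon formulation (the full standard also applies an effective-width adjustment, omitted here):

```python
import math

def index_of_difficulty(distance, width):
    """Fitts' index of difficulty in bits (Shannon formulation)."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Throughput in bits/s for one pointing condition."""
    return index_of_difficulty(distance, width) / movement_time_s
```

    For example, a target 1 unit wide at a distance of 7 units has an index of difficulty of 3 bits; reaching it in 1.5 s gives a throughput of 2 bits/s.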

    Microgravity cursor control device evaluation for Space Station Freedom workstations

    This research addressed the usability of direct manipulation interfaces (cursor control devices) in microgravity. The data discussed are from KC-135 flights and include pointing and dragging movements over a variety of angles and distances. Detailed error and completion time data provided researchers with information regarding cursor control device shape, selection button arrangement, sensitivity, selection modes, and considerations for future research.

    The effect of on/off indicator design on state confusion, preference, and response time performance, executive summary

    Five designs of software-based ON/OFF indicators were investigated in a hypothetical Space Station Power System monitoring task. The hardware equivalent of the indicators used in the present study is the traditional indicator light that illuminates an ON label or an OFF label. Coding methods used to represent the active state were reverse video, color, frame, check, or reverse video with check. Display background color was also varied. Subjects' judgments concerning the state of the indicators resulted in very low error rates and high percentages of agreement across indicator designs. Response time measures for the five indicator designs did not differ significantly, although subjects reported that color was the best communicator. The impact of these results on indicator design is discussed.

    Evidence Report: Risk of Inadequate Human-Computer Interaction

    Human-computer interaction (HCI) encompasses all the methods by which humans and computer-based systems communicate, share information, and accomplish tasks. When HCI is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. HCI has rarely been studied in an operational spaceflight context, and detailed performance data that would support evaluation of HCI have not been collected; thus, we draw much of our evidence from post-spaceflight crew comments and from other safety-critical domains such as ground-based power plants and aviation. Additionally, there is a concern that any potential or real issues to date may have been masked by the fact that crews have near-constant access to ground controllers, who monitor for errors, correct mistakes, and provide additional information needed to complete tasks. We do not know what types of HCI issues might arise without this "safety net". Exploration missions will test this concern, as crews may be operating autonomously due to communication delays and blackouts. Crew survival will be heavily dependent on available electronic information for just-in-time training, procedure execution, and vehicle or system maintenance; hence, the criticality of the Risk of Inadequate HCI. Future work must focus on identifying the most important contributing risk factors, evaluating their contribution to the overall risk, and developing appropriate mitigations. The Risk of Inadequate HCI includes eight core contributing factors based on the Human Factors Analysis and Classification System (HFACS): (1) Requirements, policies, and design processes; (2) Information resources and support; (3) Allocation of attention; (4) Cognitive overload; (5) Environmentally induced perceptual changes; (6) Misperception and misinterpretation of displayed information; (7) Spatial disorientation; and (8) Displays and controls.

    Human Factors Engineering as a System in the Vision for Exploration

    In order to accomplish NASA's Vision for Exploration, while assuring crew safety and productivity, human performance issues must be well integrated into system design from mission conception. To that end, a two-year Technology Development Project (TDP) was funded by NASA Headquarters to develop a systematic method for including the human as a system in NASA's Vision for Exploration. The specific goals of this project are to review current Human Systems Integration (HSI) standards (i.e., industry, military, NASA) and tailor them to selected NASA Exploration activities. Once the methods are proven in the selected domains, a plan will be developed to expand the effort to a wider scope of Exploration activities. The methods will be documented for inclusion in NASA-specific documents (such as the Human Systems Integration Standards, NASA-STD-3000) to be used in future space systems. The current project builds on a previous TDP dealing with Human Factors Engineering processes. That project identified the key phases of the current NASA design lifecycle, and outlined the recommended HFE activities that should be incorporated at each phase. The project also resulted in a prototype of a web-based HFE process tool that could be used to support an ideal HFE development process at NASA. This will help to augment the limited human factors resources available by providing a web-based tool that explains the importance of human factors, teaches a recommended process, and then provides the instructions, templates, and examples to carry out the process steps. The HFE activities identified by the previous TDP are being tested in situ for the current effort through support to a specific NASA Exploration activity. Currently, HFE personnel are working with systems engineering personnel to identify HSI impacts for lunar exploration by facilitating the generation of system-level Concepts of Operations (ConOps). For example, medical operations scenarios have been generated for lunar habitation in order to identify HSI requirements for the lunar communications architecture. Throughout these ConOps exercises, HFE personnel are testing various tools and methodologies that have been identified in the literature. A key part of the effort is the identification of optimal processes, methods, and tools for these early development phase activities, such as ConOps, requirements development, and early conceptual design. An overview of the activities completed thus far, as well as the tools and methods investigated, will be presented.

    Integrating Human Factors into Crew Exploration Vehicle (CEV) Design

    The purpose of this design process is to apply Human Engineering (HE) requirements and guidelines to hardware/software and to provide HE design, analysis, and evaluation of crew interfaces. The topics include: 1) Background/Purpose; 2) HE Activities; 3) CASE STUDY: Net Habitable Volume (NHV) Study; 4) CASE STUDY: Human Modeling Approach; 5) CASE STUDY: Human Modeling Results; 6) CASE STUDY: Human Modeling Conclusions; 7) CASE STUDY: Human-in-the-Loop Evaluation Approach; 8) CASE STUDY: Unsuited Evaluation Results; 9) CASE STUDY: Suited Evaluation Results; 10) CASE STUDY: Human-in-the-Loop Evaluation Conclusions; 11) Near-Term Plan; and 12) In Conclusion.

    Cursor Control Device Test Battery

    The test battery was developed to provide a standard procedure for cursor control device evaluation. The software was built in Visual Basic and consists of nine tasks and a main menu that integrates the set-up of the tasks. The tasks can be used individually, or in a series defined in the main menu. Task 1, the Unidirectional Pointing Task, tests the speed and accuracy of clicking on targets. Two rectangles with an adjustable width and adjustable center-to-center distance are presented. The task is to click back and forth between the two rectangles. Clicks outside of the rectangles are recorded as errors. Task 2, Multidirectional Pointing Task, measures speed and accuracy of clicking on targets approached from different angles. Twenty-five numbered squares of adjustable width are arranged around an adjustable diameter circle. The task is to point and click on the numbered squares (placed on opposite sides of the circle) in consecutive order. Clicks outside of the squares are recorded as errors. Task 3, Unidirectional (horizontal) Dragging Task, is similar to dragging a file into a folder on a computer desktop. Task 3 requires dragging a square of adjustable width from one rectangle and dropping it into another. The width of each rectangle is adjustable, as well as the distance between the two rectangles. Dropping the square outside of the rectangles is recorded as an error. Task 4, Unidirectional Path Following, is similar to Task 3. The task is to drag a square through a tunnel consisting of two lines. The size of the square and the width of the tunnel are adjustable. If the square touches any of the lines, it is counted as an error and the task is restarted. Task 5, Text Selection, involves clicking on a Start button, and then moving directly to the underlined portion of the displayed text and highlighting it. The pointing distance to the text is adjustable, as well as the to-be-selected font size and the underlined character length.
If the selection does not include all of the underlined characters, or includes non-underlined characters, it is recorded as an error. Task 6, Multi-size and Multi-distance Pointing, presents the participant with 24 consecutively numbered buttons of different sizes (63 to 163 pixels), and at different distances (60 to 80 pixels) from the Start button. The task is to click on the Start button, and then move directly to, and click on, each numbered target button in consecutive order. Clicks outside of the target area are errors. Task 7, Standard Interface Elements Task, involves interacting with standard interface elements as instructed in written procedures, including: drop-down menus, sliders, text boxes, radio buttons, and check boxes. Task completion time is recorded. In Task 8, a circular track is presented with a disc in it at the top. Track width and disc size are adjustable. The task is to move the disc with circular motion within the path without touching the boundaries of the track. Time and errors are recorded. Task 9 is a discrete task that allows evaluation of discrete cursor control devices that tab from target to target, such as a castle switch. The task is to follow a predefined path and to click on the yellow targets along the path.
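    The hit/error logic described for the Unidirectional Pointing Task can be sketched as a simple rectangle hit test: clicks are expected to alternate between the two target rectangles, and a click landing outside the expected target is logged as an error. The geometry and scoring rule below are illustrative assumptions, not the battery's actual Visual Basic implementation:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge
    y: float  # top edge
    w: float  # width (adjustable in the task)
    h: float  # height

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def score_pointing_trial(left, right, clicks):
    """Score a unidirectional pointing trial.

    Clicks are expected to alternate left, right, left, ...;
    a click outside the expected rectangle counts as an error.
    Returns (hits, errors).
    """
    hits = errors = 0
    for i, (px, py) in enumerate(clicks):
        target = left if i % 2 == 0 else right
        if target.contains(px, py):
            hits += 1
        else:
            errors += 1
    return hits, errors
```

    Pairing each hit with its timestamp would give the completion-time data the battery records alongside errors.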

    A comparison of paper and computer procedures in a Shuttle flight environment

    The Electronic Procedures Experiment (EPROC) was flown as part of the Human Factors Assessment (HFA) experiment aboard the SpaceHab-1/STS-57 mission. EPROC is concerned with future, longer-duration missions, which will increasingly rely on electronic procedures since they are more easily launched and updated in flight, and offer automatic or on-request capabilities not available with paper. A computer-based task simulating a Space Station Propulsion System task was completed by one crewmember. The crewmember performed the task once using paper procedures and once using computer procedures. A soldering and desoldering task was performed by another crewmember; soldering was completed with paper procedures and desoldering with computer procedures. Objective data were collected during each task session from the computer programs, videotapes, and crew notations in the paper and computer procedures. After each task session, subjective data were collected through a computer-based questionnaire program. The resulting recommendations will be made available to future designers of electronic procedures systems for manned space missions and other related uses.